Dual Principal Component Pursuit
Authors
Abstract
We consider the problem of outlier rejection in single subspace learning. Classical approaches work with a direct representation of the subspace and are thus efficient when the subspace dimension is small. Our approach works with a dual representation of the subspace and hence aims to find its orthogonal complement; as such, it is particularly suitable for high-dimensional subspaces. We pose the problem of computing normal vectors to the subspace as a non-convex ℓ1 minimization problem on the sphere, which we call Dual Principal Component Pursuit (DPCP). We provide theoretical guarantees under which every global solution of DPCP is a vector in the orthogonal complement of the inlier subspace. Moreover, we relax the non-convex DPCP problem to a recursion of linear programming problems, which, as we show, converges in a finite number of steps to a vector orthogonal to the subspace. In particular, when the inlier subspace is a hyperplane, the linear programming recursion converges in a finite number of steps to the global minimum of the non-convex DPCP problem. We propose algorithms based on alternating minimization and Iteratively Reweighted Least Squares that are suitable for dealing with large-scale data. Extensive experiments on synthetic data show that the proposed methods can handle more outliers and higher-dimensional subspaces than state-of-the-art methods, while experiments with real face and object images show that our DPCP-based methods are competitive with the state-of-the-art.
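To make the objective concrete, here is a minimal sketch of an Iteratively Reweighted Least Squares scheme for the DPCP problem min over unit vectors b of ‖Xᵀb‖₁; the function name, the weight cap `delta`, and the initialization from the least-significant singular vector are illustrative choices for this sketch, not necessarily the exact algorithm of the paper.

```python
import numpy as np

def dpcp_irls(X, num_iters=100, delta=1e-9, tol=1e-8):
    """IRLS sketch for min_{||b||=1} ||X^T b||_1 (the DPCP objective).

    X : (D, N) array whose columns are the data points (inliers + outliers).
    Returns a unit vector b intended to be approximately orthogonal to the
    inlier subspace (the normal vector when that subspace is a hyperplane
    and the conditions of the theory hold).
    """
    # Initialize with the least significant right singular vector of X^T.
    _, _, Vt = np.linalg.svd(X.T, full_matrices=False)
    b = Vt[-1]
    for _ in range(num_iters):
        r = np.abs(X.T @ b)                # residuals |x_j^T b|
        w = 1.0 / np.maximum(r, delta)     # IRLS weights, capped near zero residuals
        C = (X * w) @ X.T                  # weighted scatter  sum_j w_j x_j x_j^T
        eigvals, eigvecs = np.linalg.eigh(C)
        b_new = eigvecs[:, 0]              # eigenvector of the smallest eigenvalue
        if b_new @ b < 0:                  # resolve the sign ambiguity
            b_new = -b_new
        converged = np.linalg.norm(b_new - b) < tol
        b = b_new
        if converged:
            break
    return b
```

Each iteration reduces to a smallest-eigenvector computation on a D×D weighted scatter matrix, which is what makes this style of scheme attractive when the number of points N is large.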
Similar References
Robust Principal Component Analysis by Projection Pursuit
Different algorithms for principal component analysis (PCA) based on the idea of projection pursuit are proposed. We show how the algorithms are constructed and compare the new algorithms with standard algorithms. With the R implementation pcaPP we demonstrate their usefulness on real data examples. Finally, it is outlined how the algorithms can be used for robustifying other multivariate m...
Algorithms for projection-pursuit robust principal component analysis
Principal Component Analysis (PCA) is very sensitive to the presence of outliers. One of the most appealing robust methods for principal component analysis uses the Projection-Pursuit principle: one projects the data onto a lower-dimensional space such that a robust measure of variance of the projected data is maximized. The Projection-Pursuit-based method for principal component analysis ...
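As a hedged illustration of this projection-pursuit idea (a sketch in the spirit of candidate-direction algorithms, not the exact method of the cited paper or of pcaPP), the code below scores candidate directions taken from the centered observations by the median absolute deviation (MAD) of the projected data, keeps the best one, deflates, and repeats:

```python
import numpy as np

def pp_robust_pca(X, n_components=2):
    """Projection-pursuit robust PCA sketch.

    X : (n_samples, n_features). Candidate directions are the median-centered
    observations themselves; for each component the direction maximizing the
    MAD (a robust spread measure) of the projections is kept, then the data
    are deflated and the search repeats.
    """
    X = X - np.median(X, axis=0)                    # robust centering
    components = []
    for _ in range(n_components):
        norms = np.linalg.norm(X, axis=1)
        mask = norms > 1e-12
        if not mask.any():                          # nothing left to explain
            break
        cands = X[mask] / norms[mask][:, None]      # unit candidate directions
        scores = cands @ X.T                        # projections, (n_cand, n_samples)
        mads = np.median(np.abs(scores - np.median(scores, axis=1, keepdims=True)), axis=1)
        v = cands[np.argmax(mads)]                  # direction with largest robust spread
        components.append(v)
        X = X - np.outer(X @ v, v)                  # deflate the found direction
    return np.array(components)
```

Replacing the MAD with the classical standard deviation would recover ordinary PCA directions, which is exactly the robustification the excerpt describes.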
Solving Multiple-Block Separable Convex Minimization Problems Using Two-Block Alternating Direction Method of Multipliers
In this paper, we consider solving multiple-block separable convex minimization problems using the alternating direction method of multipliers (ADMM). Motivated by the fact that the existing convergence theory for ADMM is mostly limited to the two-block case, we analyze, both theoretically and numerically, a new strategy that first transforms a multi-block problem into an equ...
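As a hedged illustration of the grouping idea alluded to above (the paper's exact transformation may differ), a three-block separable problem can be rewritten as an equivalent two-block one:

```latex
% three-block separable problem
\min_{x_1, x_2, x_3}\; f_1(x_1) + f_2(x_2) + f_3(x_3)
\quad\text{s.t.}\quad A_1 x_1 + A_2 x_2 + A_3 x_3 = b
% grouping y = (x_2, x_3), B = [A_2 \; A_3], g(y) = f_2(x_2) + f_3(x_3)
% gives a two-block problem to which the standard ADMM theory applies
\min_{x_1,\, y}\; f_1(x_1) + g(y)
\quad\text{s.t.}\quad A_1 x_1 + B y = b
```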
A Characterization of Principal Components for Projection Pursuit
Principal Component Analysis is a technique often found to be useful for identifying structure in multivariate data. Although it has various characterizations (Rao 1964), the most familiar is as a variance-maximizing projection. Projection pursuit is a methodology for selecting low-dimensional projections of multivariate data by the optimization of some index of "interestingness" over all proje...
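For concreteness, the two viewpoints contrasted in this excerpt can be written as follows (standard formulations, not quoted from the cited paper), where I denotes the projection index and choosing I = Var recovers the first principal component:

```latex
v_{\mathrm{PCA}} \;=\; \arg\max_{\|v\|_2 = 1} \operatorname{Var}(Xv),
\qquad
v_{\mathrm{PP}} \;=\; \arg\max_{\|v\|_2 = 1} I(Xv)
```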
An Improved Traffic Matrix Decomposition Method with Frequency-Domain Regularization
We propose a novel network traffic matrix decomposition method named Stable Principal Component Pursuit with Frequency-Domain Regularization (SPCP-FDR), which improves the Stable Principal Component Pursuit (SPCP) method by using a frequency-domain noise regularization function. An experiment demonstrates the feasibility of this new decomposition method. Key words: Traffic Matrix, Stable Princip...
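For reference, the baseline Stable Principal Component Pursuit problem that SPCP-FDR modifies is commonly written as below; how the frequency-domain regularization enters is not stated in this truncated excerpt, so only the standard SPCP formulation is shown:

```latex
% M: observed traffic matrix, L: low-rank component, S: sparse anomalies
\min_{L,\, S}\; \|L\|_* + \lambda \|S\|_1
\quad\text{s.t.}\quad \|M - L - S\|_F \le \delta
```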